# Abstractive Summarization
## T5 Abstractive
**Author:** migz117 · **Downloads:** 23 · **Likes:** 0
A T5 model fine-tuned for Indonesian text summarization, developed by the SumText team at Bina Nusantara University.
*Tags: Text Generation, Other*
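T5 casts summarization as a text-to-text task: the input is prepended with a task prefix before the summary is decoded. A minimal sketch of that convention; the `summarize: ` prefix comes from the original T5 setup, and a fine-tune such as this one may expect a different prefix or none at all, so treat it as an assumption.

```python
def t5_input(text: str, prefix: str = "summarize: ") -> str:
    """Build a T5 text-to-text input by prepending a task prefix.

    The "summarize: " prefix follows the original T5 convention;
    individual fine-tunes may expect a different prefix or none.
    """
    return prefix + text.strip()

example = t5_input("  Jakarta adalah ibu kota Indonesia.  ")
```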
## Bart Log Summarization
**Author:** VidhuMathur · **License:** MIT · **Downloads:** 660 · **Likes:** 4
A BART-base model fine-tuned for abstractive summarization of software logs.
*Tags: Text Generation, Transformers*
## Bart Conversation Summarizer
**Author:** Azma-AI · **Downloads:** 39 · **Likes:** 6
A BART-based dialogue summarization model fine-tuned on the SAMSum dialogue dataset.
*Tags: Text Generation, Transformers*
## Mlong T5 Large Sumstew
**Author:** Joemgu · **License:** Apache-2.0 · **Downloads:** 103 · **Likes:** 9
A multilingual abstractive summarization model for long documents (up to 16k input tokens). Trained on the Sumstew dataset, it generates both a title and a summary for a given input document.
*Tags: Text Generation, Transformers, Multiple Languages*
## Arman Longformer 8k
**Author:** zedfum · **Downloads:** 17 · **Likes:** 0
A Persian text summarization model built on the Longformer attention mechanism, supporting inputs of up to 8k tokens.
*Tags: Text Generation, Transformers, Other*
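Even long-input models like the two above have a hard window (16k and 8k tokens in these listings). A common workaround for longer documents is to summarize overlapping chunks and then summarize the concatenation of the partial summaries. A minimal sketch of the chunking step, using a naive whitespace split as a stand-in for the model's real tokenizer:

```python
def chunk_tokens(tokens, max_len=8192, overlap=256):
    """Split a token list into overlapping chunks that each fit the
    model window; the overlap preserves context across boundaries."""
    if max_len <= overlap:
        raise ValueError("max_len must exceed overlap")
    chunks = []
    start = 0
    while start < len(tokens):
        chunks.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break
        start += max_len - overlap
    return chunks

# Naive whitespace "tokens" stand in for a real tokenizer's output.
words = ("sentence " * 20000).split()
chunks = chunk_tokens(words, max_len=8192, overlap=256)
```

In practice the chunk size should sit slightly below the model limit to leave room for special tokens.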
## Finetuned Bart For Conversation Summary
**Author:** kabita-choudhary · **Downloads:** 369 · **Likes:** 53
A BART-based dialogue summarization model fine-tuned on the SAMSum corpus, generating high-quality dialogue summaries.
*Tags: Text Generation, Transformers*
## Ptt5 Base Summ Temario
**Author:** recogna-nlp · **License:** MIT · **Downloads:** 159 · **Likes:** 1
A PTT5-based model fine-tuned to generate abstractive summaries of Brazilian Portuguese texts.
*Tags: Text Generation, Transformers, Other*
## T5 Base Standard Bahasa Cased
**Author:** mesolitica · **Downloads:** 33 · **Likes:** 0
A pre-trained T5-base model for standard Malay, trained with a multitask objective and supporting a range of natural language processing tasks.
*Tags: Large Language Model, Transformers, Other*
## Distill Pegasus Cnn 16 4
**Author:** sshleifer · **Downloads:** 286 · **Likes:** 4
A distilled variant of PEGASUS, Google Research's abstractive summarization model pre-trained with gap sentences.
*Tags: Text Generation, Transformers, English*
## Pegasus Large
**Author:** google · **Downloads:** 43.35k · **Likes:** 103
PEGASUS is an abstractive summarization model pre-trained with gap sentences, developed by Google Research.
*Tags: Text Generation, English*
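The "gap sentences" objective behind the PEGASUS entries removes whole sentences from a document and trains the model to generate them, which closely mimics summarization. A toy sketch of the idea; real PEGASUS selects sentences by ROUGE score against the rest of the document, whereas this stand-in ranks them by simple vocabulary overlap:

```python
def make_gsg_example(sentences, mask_ratio=0.3, mask_token="<mask>"):
    """Toy gap-sentence generation: replace the sentences that share the
    most vocabulary with the rest of the document by a mask token; the
    removed sentences, in document order, become the generation target."""
    n_mask = max(1, int(len(sentences) * mask_ratio))

    def overlap(i):
        words = set(sentences[i].lower().split())
        rest = {w for j, s in enumerate(sentences) if j != i
                for w in s.lower().split()}
        return len(words & rest)

    ranked = sorted(range(len(sentences)), key=overlap, reverse=True)
    masked = set(ranked[:n_mask])
    source = " ".join(mask_token if i in masked else s
                      for i, s in enumerate(sentences))
    target = " ".join(sentences[i] for i in sorted(masked))
    return source, target

sents = ["The cat sat on the mat.",
         "Dogs bark loudly.",
         "The cat slept on the mat."]
source, target = make_gsg_example(sents)
```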
## Meeting Summary Samsum
**Author:** knkarthick · **License:** Apache-2.0 · **Downloads:** 25 · **Likes:** 9
A BART-based seq2seq model for dialogue summarization, fine-tuned on the SAMSum dataset.
*Tags: Text Generation, Transformers, English*
## Legal Pegasus
**Author:** nsi319 · **License:** MIT · **Downloads:** 2,658 · **Likes:** 12
A PEGASUS model fine-tuned for the legal domain, designed to generate abstractive summaries of legal documents.
*Tags: Text Generation, Transformers, English*
## MEETING SUMMARY BART LARGE XSUM SAMSUM DIALOGSUM AMI
**Author:** knkarthick · **License:** Apache-2.0 · **Downloads:** 119 · **Likes:** 15
A BART-based sequence-to-sequence model fine-tuned for meeting and dialogue summarization, generating abstractive summaries across a variety of dialogue datasets.
*Tags: Text Generation, Transformers, English*
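SAMSum-style summarizers like the knkarthick entries above expect the whole conversation as a single newline-separated string of `Speaker: utterance` turns. A sketch of preparing that input; the hub id in the comment is an assumption taken from this listing and may differ from the actual repository name:

```python
def format_dialogue(turns):
    """Join (speaker, utterance) pairs into the newline-separated
    'Speaker: utterance' transcript these SAMSum-style models expect."""
    return "\n".join(f"{speaker}: {utterance}" for speaker, utterance in turns)

dialogue = format_dialogue([
    ("Amanda", "Are we still on for lunch?"),
    ("Jerry", "Yes, 12:30 at the usual place."),
    ("Amanda", "Great, see you there."),
])

# With transformers installed, the summary would then be produced roughly as
# follows (model id assumed from this listing, not verified):
#   from transformers import pipeline
#   summarizer = pipeline("summarization", model="knkarthick/MEETING_SUMMARY")
#   print(summarizer(dialogue)[0]["summary_text"])
```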
## Bart Large Xsum Samsum
**Author:** knkarthick · **License:** Apache-2.0 · **Downloads:** 14 · **Likes:** 0
A BART-large-xsum model fine-tuned on the SAMSum dataset for dialogue summarization.
*Tags: Text Generation, Transformers, English*
## Pegasus Billsum
**Author:** google · **Downloads:** 295 · **Likes:** 4
A PEGASUS abstractive summarization model pre-trained with gap-sentence extraction, focused on generating high-quality text summaries.
*Tags: Text Generation, Transformers, English*
## Pegasus Reddit Tifu
**Author:** google · **Downloads:** 17 · **Likes:** 3
A PEGASUS model pre-trained with gap-sentence extraction, designed for abstractive summarization tasks.
*Tags: Text Generation, Transformers, English*